18 research outputs found

    Computational intelligent sensor-rank consolidation approach for Industrial Internet of Things (IIoT).

    Get PDF
    Continuous field monitoring and searching of sensor data remain essential elements underpinning the influence of the Internet of Things (IoT). Most existing systems rely on spatial coordinates or semantic keywords to retrieve the required data, which are not comprehensive constraints because of sensor cohesion and the randomness of unique localization. To address this issue, we propose a deep-learning-inspired sensor-rank consolidation (DLi-SRC) system comprising three algorithms. First, a sensor cohesion algorithm based on the Lyapunov approach accelerates sensor stability. Second, a sensor unique-localization algorithm based on a rank-inferior measurement index avoids data redundancy and data loss. Third, a heuristic directive algorithm improves the search efficiency for the required data, returning appropriately ranked sensor results according to the search specification. We performed thorough simulations to assess the effectiveness of DLi-SRC. The outcomes reveal significant performance gains, including improved search efficiency and service quality, a 91% enhancement in sensor existence rate, and a 49% sensor energy gain over benchmark approaches
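    To make the rank-consolidation idea concrete, the sketch below orders sensors by a weighted composite of stability, redundancy, and query relevance. It is only a minimal illustration under assumed inputs: the `Sensor` fields, the weights, and the linear score are hypothetical stand-ins, not the paper's Lyapunov-based cohesion or rank-inferior measurement index.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    stability: float   # assumed cohesion/stability score in [0, 1]
    redundancy: float  # assumed redundancy penalty in [0, 1]
    relevance: float   # assumed match against the search specification

def rank_sensors(sensors, w_stab=0.4, w_red=0.3, w_rel=0.3):
    """Return sensors ordered by a weighted consolidation score (highest first)."""
    def score(s: Sensor) -> float:
        return w_stab * s.stability - w_red * s.redundancy + w_rel * s.relevance
    return sorted(sensors, key=score, reverse=True)

if __name__ == "__main__":
    fleet = [
        Sensor("s1", stability=0.9, redundancy=0.1, relevance=0.8),
        Sensor("s2", stability=0.6, redundancy=0.5, relevance=0.9),
        Sensor("s3", stability=0.8, redundancy=0.2, relevance=0.4),
    ]
    for s in rank_sensors(fleet):
        print(s.sensor_id)
```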

    Enhancing the Access Privacy of IDaaS System Using SAML Protocol in Fog Computing

    Get PDF
    Fog environment adoption is increasing day by day in industry. Unauthorized access to data occurs because the identity and information of users are preserved either at the endpoints or at the middleware. This paper proposes a methodology to protect and preserve user identity during data transmission. It uses fog computing for storage to counter security issues in cloud and database environments. Cloud and database architectures have failed to protect the data and identity of users, but a fog-computing-based Identity Management as a Service (IDaaS) system can handle this with the Security Assertion Markup Language (SAML) protocol and a Pentatope-based elliptic curve crypto cipher. A detailed comparative study of the proposed and existing techniques is presented, considering multi-authentication dialogue, security services, service providers, identity, and access management
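    As a rough illustration of assertion-based identity checking (not the paper's SAML plus Pentatope ECC scheme), the sketch below issues and verifies a minimal signed identity assertion with an HMAC over a shared key. The field names, the shared-key approach, and the expiry check are assumptions for demonstration; real SAML uses XML assertions with PKI signatures.

```python
import hashlib
import hmac
import json
import time

SHARED_KEY = b"demo-shared-key"  # placeholder only; not how SAML signs assertions

def issue_assertion(subject: str, issuer: str, lifetime_s: int = 300) -> dict:
    """Build a minimal identity assertion and attach an HMAC signature."""
    body = {"subject": subject, "issuer": issuer,
            "expires": int(time.time()) + lifetime_s}
    payload = json.dumps(body, sort_keys=True).encode()
    body["signature"] = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return body

def verify_assertion(assertion: dict) -> bool:
    """Check the signature and expiry before granting access."""
    claimed = assertion.get("signature", "")
    body = {k: v for k, v in assertion.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected) and body["expires"] > time.time()

if __name__ == "__main__":
    token = issue_assertion("user@example.org", "idaas.fog.local")
    print(verify_assertion(token))  # True while the assertion is fresh
```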

    DAWM: cost-aware asset claim analysis approach on big data analytic computation model for cloud data centre.

    Get PDF
    Application tasks with heterogeneous resource requirements increase the cloud service provider's (CSP) energy cost while generating revenue from the resources provided on demand. Enhancing CSP profit while containing energy cost is a challenging task. Most existing approaches consider the task deadline violation rate rather than the performance cost and server size ratio during profit estimation, which reduces CSP revenue and causes high service cost. To address this issue, we develop two algorithms for profit maximization and adequate service reliability. First, a belief-propagation-influenced cost-aware asset scheduling approach is derived from the data analytic weight measurement (DAWM) model for effective performance and server size optimization. Second, the multiobjective heuristic user service demand (MHUSD) approach is formulated based on the CSP profit estimation model and the user service demand (USD) model with directed acyclic graph (DAG) phenomena for adequate service reliability. The DAWM model classifies prominent servers to preserve server resource usage and cost during an effective resource-slicing process by considering each machine's execution factors (remaining energy, energy and service cost, workload execution rate, service deadline violation rate, cloud server configuration (CSC), service requirement rate, and service level agreement violation (SLAV) penalty rate). The MHUSD algorithm measures the user demand service rate and cost based on the USD and CSP profit estimation models by considering service demand weight, tenant cost, and energy cost. The simulation results show that the proposed system achieves an average revenue gain of 35%, a cost reduction of 51%, and a profit gain of 39% over state-of-the-art approaches
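    The server-classification step can be illustrated with a simple weighted score over the execution factors the abstract lists. This is only a sketch under assumed weights and pre-normalized inputs, not the belief-propagation-based DAWM model itself.

```python
# Hypothetical weights over factors named in the abstract (normalized to [0, 1]);
# negative weights penalize costly or violation-prone servers.
WEIGHTS = {
    "remaining_energy": 0.25,
    "energy_service_cost": -0.20,
    "workload_execution_rate": 0.20,
    "deadline_violation_rate": -0.15,
    "service_requirement_rate": 0.10,
    "slav_penalty_rate": -0.10,
}

def server_score(factors: dict) -> float:
    """Combine normalized execution factors into a single preference score."""
    return sum(WEIGHTS[name] * factors.get(name, 0.0) for name in WEIGHTS)

def pick_prominent_servers(servers: dict, top_k: int = 2):
    """Rank candidate servers by score and keep the top_k for resource slicing."""
    ranked = sorted(servers.items(), key=lambda kv: server_score(kv[1]), reverse=True)
    return [name for name, _ in ranked[:top_k]]

if __name__ == "__main__":
    candidates = {
        "srv-a": {"remaining_energy": 0.9, "energy_service_cost": 0.3,
                  "workload_execution_rate": 0.7, "deadline_violation_rate": 0.1},
        "srv-b": {"remaining_energy": 0.4, "energy_service_cost": 0.6,
                  "workload_execution_rate": 0.5, "deadline_violation_rate": 0.4},
    }
    print(pick_prominent_servers(candidates, top_k=1))  # ['srv-a']
```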

    Survival Study on Blockchain Based 6G-Enabled Mobile Edge Computation for IoT Automation

    Get PDF
    The Internet of Things (IoT) and Mobile Edge Computing (MEC) play a significant part in daily life by facilitating the control and monitoring of objects, revolutionizing the way humans interact with the physical world. An IoT system handles large volumes of data and relies on network connectivity, power, and storage resources to transform that data into meaningful information. Blockchain, with its decentralized nature, provides a useful mechanism for addressing IoT challenges. A blockchain is a distributed ledger whose fundamental attributes are that it is recorded, transparent, and decentralized. Participants in the distributed ledger record transactions and communicate with one another through a trustless method. Security is considered the most valuable feature of blockchain. IoT and blockchain are emerging ideas for creating applications that share these intrinsic features. Several existing works have integrated blockchain with IoT, but the blockchain protocols used in state-of-the-art IoT works fail to consider computational load, delay, and bandwidth overhead, which leads to a new set of problems. This review examines the main challenges in integrating blockchain and IoT technologies in order to reach high-level solutions that address the shortcomings and limitations of both

    Blockchain Security Using Merkle Hash Zero Correlation Distinguisher for the IoT in Smart Cities

    No full text
    Internet of Things (IoT) data is one of the most important assets in business models offering various ubiquitous and intelligent services. However, the susceptibility of the IoT is exploited by cybercriminals and other malicious users. Even though smart cities are intended to increase productivity and efficiency, residents and authorities face risks when cybersecurity is neglected. Conventional blockchain methods were introduced to ensure secure management and examination of smart-city big data, but these blockchains carry high computational costs, fail to improve security sufficiently, and are not adequate for the resource-constrained IoT devices designated for smart cities. To address these issues, a novel blockchain model called Blockchain Secured Merkle Hash Zero Correlation Distinguisher (BSMH-ZCD) is proposed that is suitable for IoT devices within a cloud infrastructure. The objective of the BSMH-ZCD method is to enhance security while reducing run time and computational overhead. Initially, a Merkle hash tree is used to create a hash value for every transaction. Next, a zero-correlation distinguisher is applied to perform data encryption and decryption with an ARX block cipher, providing proficient secure data access on IoT devices. Experimental assessment of the proposed BSMH-ZCD method and existing methods is carried out using a taxi driver dataset and the Novel Coronavirus 2019 dataset, considering running time, computational complexity, and security with respect to the number of blocks and executions. On the taxi driver dataset, the experimental results reveal that the BSMH-ZCD method performs better, with a 19% improvement in security, a 20% reduction in computational complexity, and 29% faster running time for IoT compared to existing works
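    The Merkle-tree step can be illustrated independently of the full BSMH-ZCD pipeline. The sketch below builds a Merkle root over a list of transaction payloads with SHA-256; the pairing rule (duplicating the last hash on odd-sized levels) is a common convention assumed here, and no ARX or zero-correlation cipher is included.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(transactions: list[bytes]) -> bytes:
    """Compute a Merkle root by repeatedly hashing pairs of child hashes."""
    if not transactions:
        return sha256(b"")
    level = [sha256(tx) for tx in transactions]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the last hash on odd-sized levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

if __name__ == "__main__":
    txs = [b"trip:42:pickup", b"trip:42:dropoff", b"trip:43:pickup"]
    print(merkle_root(txs).hex())
```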

    Lung cancer disease detection using service-oriented architectures and multivariate boosting classifier

    No full text
    Big data analytics in healthcare is emerging as a promising field for extracting valuable information from large databases and improving outcomes at lower cost. Although numerous methods have been proposed for big data analytics in the medical field, an authorized entity is required to access the data, which inhibits diagnosis accuracy and efficiency. The detection of lung cancer is particularly critical: it is the third most common type of cancer in both males and females in the US and a leading cause of cancer-related deaths worldwide. Therefore, this study introduces the Multivariate Ruzicka Regressed eXtreme Gradient Boosting Data Classification (MRRXGBDC) technique together with a service-oriented architecture (SOA) to improve prediction accuracy and reduce prediction time for lung cancer in big data analytics. SOAs provide a set of healthcare services in which patient data are stored in the database of a physician or other certified entity. After receiving the patient data as input, the physician constructs several multivariate Ruzicka logistic regression trees to calculate the relationship between the dependent and independent variables. This regression analysis determines the presence or absence of disease. The experimental results reveal that the MRRXGBDC technique performs better, with a 10% improvement in prediction accuracy, a 50% reduction in false positives, and 11% faster prediction time for lung cancer detection compared to existing works
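    As a generic illustration of gradient-boosted classification for diagnosis data (not the paper's multivariate Ruzicka regression trees or its SOA layer), the sketch below trains scikit-learn's GradientBoostingClassifier. The bundled breast-cancer dataset is used only as a stand-in, since the lung-cancer records used in the study are not referenced here.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in dataset: the study's lung-cancer data are not publicly listed in this entry.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Generic boosted trees; the Ruzicka-regressed variant from the paper is not reproduced.
model = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```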

    Linear Weighted Regression and Energy-Aware Greedy Scheduling for Heterogeneous Big Data

    No full text
    Data are presently being produced at increasing speed and in different formats, which complicates their design, processing, and evaluation. MapReduce is a distributed programming model widely used for big data parallel processing, and current implementations of MapReduce provide data locality along with robustness. In this study, a linear weighted regression and energy-aware greedy scheduling (LWR-EGS) method was developed to handle big data. The LWR-EGS method first selects tasks for assignment and then selects the best available machine to identify an optimal solution. With this objective, the problem was first modeled as an integer linear weighted regression program to choose tasks for assignment. Then, the best available machines were selected to find the optimal solution, thereby optimizing resource usage. Next, an energy-efficiency-aware greedy scheduling algorithm was presented to select a position for each task so as to minimize the total energy consumption of the MapReduce job for big data applications in heterogeneous environments without a significant performance loss. To evaluate the performance, the LWR-EGS method was compared with two related MapReduce approaches. The experimental results showed that the LWR-EGS method effectively reduced the total energy consumption without producing large scheduling overheads, and it also reduced the execution time compared to state-of-the-art methods. The LWR-EGS method reduced energy consumption, average processing time, and scheduling overhead by 16%, 20%, and 22%, respectively, compared to existing methods
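    The greedy, energy-aware assignment idea can be sketched as follows: each task is placed on the feasible machine that adds the least energy, given assumed per-machine energy rates and capacities. This is a simplified stand-in, not the paper's integer linear weighted regression formulation.

```python
def greedy_energy_schedule(tasks, machines):
    """Assign each task to the feasible machine that adds the least energy.

    tasks:    {task_id: workload_units}
    machines: {machine_id: (energy_per_unit, capacity_units)}  # assumed cost model
    Returns   ({task_id: machine_id}, total_energy)
    """
    load = {m: 0.0 for m in machines}
    assignment, total_energy = {}, 0.0
    # Place heavier tasks first so they claim the most energy-efficient slots.
    for task, work in sorted(tasks.items(), key=lambda kv: -kv[1]):
        feasible = [m for m in machines if load[m] + work <= machines[m][1]]
        if not feasible:
            raise RuntimeError(f"no machine can fit task {task}")
        best = min(feasible, key=lambda m: work * machines[m][0])
        assignment[task] = best
        load[best] += work
        total_energy += work * machines[best][0]
    return assignment, total_energy

if __name__ == "__main__":
    tasks = {"map-1": 8.0, "map-2": 5.0, "reduce-1": 3.0}
    machines = {"node-a": (1.2, 10.0), "node-b": (0.9, 10.0)}  # (energy/unit, capacity)
    plan, energy = greedy_energy_schedule(tasks, machines)
    print(plan, round(energy, 1))
```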

    Hybrid Model for Security-Aware Cluster Head Selection in Wireless Sensor Networks

    No full text
    A wireless sensor network (WSN) is a resource-constrained network in which all nodes have limited resources. Prolonging the lifetime of a WSN remains an open problem. Accordingly, this study proposes a hybrid Grouped Grey Wolf Search Optimisation (GGWSO) algorithm to improve cluster head selection in WSNs so that the network's lifetime can be extended. The proposed method addresses the main constraints associated with distance, delay, energy, and security. The study compares the performance of the proposed GGWSO with several traditional cluster-head-selection algorithms, including artificial bee colony (ABC), fractional ABC, group search optimisation, and Grey Wolf optimisation. During the performance analysis, various levels of risk (20%, 60%, and 100%) are added to validate the performance variations by evaluating the number of alive nodes and the normalised network energy remaining in the network. The simulation results show that a hybrid model is needed to attain superior results
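    To make the multi-objective selection concrete, the sketch below scores candidate cluster heads on the four constraints the study names (distance, delay, energy, security) and picks the best one. The metrics, weights, and exhaustive search are assumptions standing in for the grouped grey-wolf search itself.

```python
# Hypothetical per-node metrics, each already normalized to [0, 1].
candidates = {
    "node-1": {"distance": 0.2, "delay": 0.3, "energy": 0.9, "security": 0.8},
    "node-2": {"distance": 0.5, "delay": 0.2, "energy": 0.6, "security": 0.9},
    "node-3": {"distance": 0.1, "delay": 0.6, "energy": 0.7, "security": 0.5},
}

# Assumed weights: lower distance/delay is better, higher energy/security is better.
WEIGHTS = {"distance": -0.25, "delay": -0.25, "energy": 0.30, "security": 0.20}

def fitness(metrics: dict) -> float:
    """Weighted sum of the four constraints used for cluster head selection."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

def select_cluster_head(nodes: dict) -> str:
    """Exhaustively pick the node with the highest fitness (stand-in for GGWSO)."""
    return max(nodes, key=lambda n: fitness(nodes[n]))

if __name__ == "__main__":
    print(select_cluster_head(candidates))  # node-1 with these example metrics
```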